Modeling a Two-Input Nonlinear Function¶
1. Task Definition¶
The task is to construct an ANFIS model that approximates the two-input nonlinear sinc function, defined as:
$$z = \operatorname{sinc}(x, y) = \frac{\sin(x)}{x} \times \frac{\sin(y)}{y}$$
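The removable singularity at $x = 0$ (where $\sin(x)/x \to 1$) must be handled explicitly when evaluating this function numerically. A minimal sketch of the unnormalized two-input sinc (names here are illustrative, not part of the toolbox):

```python
import numpy as np

def sinc2d(x, y):
    """Unnormalized two-input sinc: sin(x)/x * sin(y)/y, with sinc(0) = 1."""
    def sinc1d(t):
        t = np.asarray(t, dtype=float)
        safe = np.where(t == 0.0, 1.0, t)  # avoid division by zero
        return np.where(t == 0.0, 1.0, np.sin(safe) / safe)
    return sinc1d(x) * sinc1d(y)
```

Note that NumPy's built-in `np.sinc`, used in the dataset code below, computes the normalized variant $\sin(\pi x)/(\pi x)$ instead.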
2. Dataset Generation¶
The dataset consists of 10,000 input-output data pairs, generated from the points of a $100 \times 100$ grid over the input space $[-10, 10] \times [-10, 10]$. Note that NumPy's np.sinc, used below, computes the normalized sinc $\sin(\pi x)/(\pi x)$.
In [6]:
import numpy as np

# 100 grid points per axis -> a 100 x 100 mesh, i.e. 10,000 samples
x = np.linspace(-10, 10, 100)
y = np.linspace(-10, 10, 100)
x, y = np.meshgrid(x, y)

# np.sinc is the normalized sinc: sin(pi * x) / (pi * x)
z = np.sinc(x) * np.sinc(y)

# Flatten the grids into an (n_samples, 2) design matrix and a target vector
X_train, y_train = np.c_[x.ravel(), y.ravel()], z.ravel()
3. Build, Train, and Evaluate ANFIS¶
Instantiate ANFISRegressor with Gaussian membership functions and the hybrid trainer:
- membership functions are inferred directly from the data;
- fit tunes both antecedent and consequent parameters;
- we call evaluate on the fitted low-level model for a compact metric report.
In [25]:
from anfis_toolbox import ANFISRegressor

# 10 Gaussian membership functions per input, FCM-based initialization,
# and the hybrid optimizer tuning antecedent and consequent parameters
model = ANFISRegressor(n_mfs=10, optimizer="hybrid_adam", epochs=100, learning_rate=1e-3, init="fcm")
model.fit(X_train, y_train)

# Report training-set metrics (MSE, RMSE, MAE, R2)
_ = model.evaluate(X_train, y_train, print_results=True)
Epoch 1 - train_loss: 0.001527
Epoch 2 - train_loss: 0.001527
Epoch 3 - train_loss: 0.001526
...
Epoch 98 - train_loss: 0.001475
Epoch 99 - train_loss: 0.001475
Epoch 100 - train_loss: 0.001474
ANFISRegressor evaluation:
MSE: 0.001473
RMSE: 0.038386
MAE: 0.012121
R2: 0.384851
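The four reported metrics follow the standard definitions (for instance, the printed RMSE is the square root of the printed MSE). evaluate's internals are not shown here, but an equivalent computation in plain NumPy is, as a sketch:

```python
import numpy as np

def regression_metrics(y_true, y_pred):
    """MSE, RMSE, MAE and R^2 from their standard definitions."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    err = y_true - y_pred
    mse = float(np.mean(err ** 2))
    rmse = float(np.sqrt(mse))
    mae = float(np.mean(np.abs(err)))
    ss_res = float(np.sum(err ** 2))
    ss_tot = float(np.sum((y_true - y_true.mean()) ** 2))
    r2 = 1.0 - ss_res / ss_tot
    return {"MSE": mse, "RMSE": rmse, "MAE": mae, "R2": r2}
```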
4. Visualize Predictions¶
In [23]:
# Predict over the training grid and reshape back to the (100, 100) mesh
z_predict = model.predict(X_train).reshape(z.shape)
In [24]:
plot_surface(x, y, z_predict)
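plot_surface is called without an import above; whether it ships with anfis_toolbox or is a local helper is not shown. A minimal matplotlib-based stand-in (the name and signature are assumptions):

```python
import matplotlib
matplotlib.use("Agg")  # headless backend so the sketch runs without a display
import matplotlib.pyplot as plt

def plot_surface(x, y, z):
    """Draw a 3-D surface for mesh-grid data x, y, z, all shaped (n, n)."""
    fig = plt.figure(figsize=(7, 5))
    ax = fig.add_subplot(111, projection="3d")
    ax.plot_surface(x, y, z, cmap="viridis", linewidth=0)
    ax.set_xlabel("x")
    ax.set_ylabel("y")
    ax.set_zlabel("z")
    fig.tight_layout()
    return fig
```

Called as plot_surface(x, y, z_predict), it renders the predicted surface over the same grid used for training.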